Feature Selection for SVMs

Authors

  • Jason Weston
  • Sayan Mukherjee
  • Olivier Chapelle
  • Massimiliano Pontil
  • Tomaso A. Poggio
  • Vladimir Vapnik
Abstract

We introduce a method of feature selection for Support Vector Machines. The method is based upon finding those features which minimize bounds on the leave-one-out error. This search can be efficiently performed via gradient descent. The resulting algorithms are shown to be superior to some standard feature selection algorithms on both toy data and real-life problems of face recognition, pedestrian detection and analyzing DNA microarray data.
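
The search the abstract describes can be pictured with a small sketch. The snippet below is a simplified illustration, not the authors' implementation: it rescales each feature by a factor sigma, approximates the radius-margin bound R²‖w‖² with a standard soft-margin RBF SVM, and shrinks sigma by a finite-difference gradient step (the paper uses analytic gradients and a modified hard-margin formulation); the function names and the values of C, gamma, the learning rate, and the step count are all assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def rbf_kernel(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def radius_margin_bound(X, y, sigma, C=10.0, gamma=0.5):
    """Approximate R^2 * ||w||^2 for an RBF SVM trained on sigma-scaled inputs."""
    Xs = X * sigma
    clf = SVC(C=C, kernel="rbf", gamma=gamma).fit(Xs, y)
    # ||w||^2 from the dual solution (dual_coef_ holds y_i * alpha_i)
    sv = Xs[clf.support_]
    K_sv = rbf_kernel(sv, sv, gamma)
    w2 = (clf.dual_coef_ @ K_sv @ clf.dual_coef_.T).item()
    # crude R^2: largest squared distance of any point to the mean in feature space
    K = rbf_kernel(Xs, Xs, gamma)
    r2 = float(np.max(np.diag(K) - 2 * K.mean(axis=1) + K.mean()))
    return r2 * w2

def select_features(X, y, n_keep, steps=30, lr=0.1, eps=1e-3):
    """Shrink per-feature scaling factors by gradient descent on the bound,
    then keep the n_keep features with the largest surviving scales."""
    sigma = np.ones(X.shape[1])
    for _ in range(steps):
        base = radius_margin_bound(X, y, sigma)
        grad = np.zeros_like(sigma)
        for j in range(sigma.size):            # finite-difference gradient
            pert = sigma.copy()
            pert[j] += eps
            grad[j] = (radius_margin_bound(X, y, pert) - base) / eps
        sigma = np.clip(sigma - lr * grad, 0.0, None)
    return np.argsort(sigma)[::-1][:n_keep], sigma
```

Features whose scaling factors remain largest after the descent are the ones retained.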

Similar articles

Determining Optimal Support Vector Machines for Hyperspectral Image Classification Based on a Genetic Algorithm

Hyperspectral remote sensing imagery, with its rich spectral information, provides an efficient tool for ground classification in complex geographical areas containing similar classes. Owing to the robustness of Support Vector Machines (SVMs) in high-dimensional spaces, they are an efficient tool for classifying hyperspectral imagery. However, there are two optimization issues which s...
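
As a rough illustration of coupling a genetic algorithm with an SVM for band selection, the toy sketch below evolves a binary mask over spectral bands and scores each mask by cross-validated accuracy of an RBF SVM. It is not the algorithm of the cited paper; the population size, mutation rate, and fitness definition are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def ga_band_selection(X, y, pop_size=20, generations=30, mut_rate=0.05, seed=0):
    """Evolve a binary mask over spectral bands; fitness is the 3-fold
    cross-validated accuracy of an RBF SVM on the selected bands."""
    rng = np.random.default_rng(seed)
    n_bands = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_bands))

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        return cross_val_score(SVC(kernel="rbf"), X[:, mask.astype(bool)], y, cv=3).mean()

    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        new_pop = [pop[scores.argmax()].copy()]         # elitism: keep the best mask
        while len(new_pop) < pop_size:
            # binary tournament selection of two parents
            a, b = rng.integers(0, pop_size, 2)
            p1 = pop[a] if scores[a] >= scores[b] else pop[b]
            a, b = rng.integers(0, pop_size, 2)
            p2 = pop[a] if scores[a] >= scores[b] else pop[b]
            cut = rng.integers(1, n_bands)              # one-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            child[rng.random(n_bands) < mut_rate] ^= 1  # bit-flip mutation
            new_pop.append(child)
        pop = np.array(new_pop)
    scores = np.array([fitness(m) for m in pop])
    return pop[scores.argmax()].astype(bool)
```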

Classification of Polarimetric SAR Images Based on Optimum SVMs Classifier Using Bees Algorithm

Because Polarimetric Synthetic Aperture Radar (PolSAR) data contain different features that relate to the physical properties of the terrain in unique ways, polarimetric imagery provides an efficient tool for classifying complex geographical areas. Support Vector Machines (SVMs) are particularly attractive in the remote sensing field due to their ability to handle the nonlinea...

Classification of Brain Glioma by Using SVMs Bagging with Feature Selection

The degree of malignancy of a brain glioma needs to be assessed from MRI findings and clinical data before an operation. There have been previous attempts to solve this problem using fuzzy max-min neural networks and support vector machines (SVMs); in this paper, a novel algorithm named PRIFEB is proposed, combining bagging of SVMs with embedded feature selection for its individual classifiers. PRIFE...
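
A generic version of this idea, bagging SVMs whose individual members perform their own feature selection, can be assembled from standard scikit-learn pieces, as sketched below. This is not PRIFEB itself; the univariate F-test selector, k=20, the ensemble size, and the use of a recent scikit-learn where the argument is named estimator are all assumptions.

```python
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Each bootstrap replicate gets its own feature-selection step followed by an SVM,
# so every base learner works with the features most informative for its sample.
base = make_pipeline(SelectKBest(f_classif, k=20), SVC(kernel="rbf"))
model = BaggingClassifier(estimator=base, n_estimators=25, random_state=0)
# model.fit(X_train, y_train); model.predict(X_test)
```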

Combining SVMs with Various Feature Selection Strategies

Feature selection is an important issue in many research areas. There are several reasons for selecting important features, such as reducing training time and improving accuracy. This thesis investigates the performance of combining support vector machines (SVMs) with various feature selection strategies. The first part of the thesis mainly describes the existing feature selection methods a...
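
One common way to compare feature selection strategies in combination with an SVM is to cross-validate the whole selector-plus-classifier pipeline over a grid of strategies, roughly as below. The particular selectors, values of k, and C grid are illustrative assumptions, not the setup used in the thesis.

```python
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Cross-validate the whole selector + SVM pipeline over several filter strategies.
pipe = Pipeline([("select", SelectKBest()), ("svm", SVC(kernel="rbf"))])
grid = GridSearchCV(
    pipe,
    {
        "select__score_func": [f_classif, mutual_info_classif],
        "select__k": [10, 20, 50],
        "svm__C": [1, 10],
    },
    cv=5,
)
# grid.fit(X_train, y_train); grid.best_params_
```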

Analytic Feature Selection for Support Vector Machines

Support vector machines (SVMs) rely on the inherent geometry of a data set to classify training data. Because of this, we believe SVMs are an excellent candidate to guide the development of an analytic feature selection algorithm, as opposed to the more commonly used heuristic methods. We propose a filter-based feature selection algorithm based on the inherent geometry of a feature set. Through...

Feature Selection for Fast Image Classification with Support Vector Machines

Based on statistical learning theory, we propose a feature selection method using support vector machines (SVMs). By exploiting the power of SVMs, we integrate the two tasks, feature selection and classifier training, into a single consistent framework, making the feature selection process more effective. Our experiments show that our SVM feature selection method can speed up the classific...
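
A simple embedded scheme in this spirit, though not necessarily the method of the cited paper, ranks features by the weight magnitudes of a trained linear SVM and keeps only the top-ranked ones before training the final classifier, shrinking the input dimensionality and speeding up prediction. The function name, n_keep, and C are assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC, SVC

def svm_weight_selection(X, y, n_keep, C=1.0):
    """Rank features by the magnitude of a linear SVM's weights and keep the top n_keep."""
    lin = LinearSVC(C=C, dual=False).fit(X, y)
    importance = np.abs(lin.coef_).sum(axis=0)   # sums over classes if multi-class
    return np.argsort(importance)[::-1][:n_keep]

# keep = svm_weight_selection(X_train, y_train, n_keep=50)
# fast_clf = SVC(kernel="rbf").fit(X_train[:, keep], y_train)
```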


Journal:

Volume   Issue

Pages  -

Publication date: 2000